

Staff MLOps Engineer at Tractable - London, UK

#artificialintelligence

Tractable is an Artificial Intelligence company bringing the speed and insight of Applied AI to visual assessment. Trained on millions of data points, our AI-powered solutions connect everyone involved in insurance, repairs, and sales of homes and cars – helping people work faster and smarter, while reducing friction and waste. Founded in 2014, Tractable is now the AI tool of choice for world-leading insurance and automotive companies. Our solutions unlock the potential of Applied AI to transform the whole recovery ecosystem, from assessing damage and accelerating claims and repairs to recycling parts. They help make the journey from response to recovery up to ten times faster – even after full-scale disasters like floods and hurricanes.


MLOps Engineer at Hala Systems - Remote, EU and US

#artificialintelligence

Hala Systems, Inc. is a social enterprise working to transform the nature of protection and accountability in the world's toughest places by democratizing advanced defense, sensing, and artificial intelligence technology. Hala is currently saving lives, reducing trauma, and improving resilience for millions of people. Our team works across the globe and hails from over 15 countries. We speak over 20 languages and have studied and worked in leading educational, business, research and government institutions. We are mission-driven thinkers, and we share a deep respect for each other and for the communities that partner with us.


MLOps Engineer

#artificialintelligence

As an MLOps Engineer, you'll know how to engineer beautiful code in Python and take pride in what you produce. You'll be an advocate of high-quality engineering and best practice, both in production software and in rapid prototypes. Whilst the position is a hands-on technical role, we'd be particularly interested in candidates with a desire to lead projects and take an active role in client discussions. Your responsibilities will involve building trusted relationships with prospects, finding creative ways to use machine learning to solve problems, scoping projects, and overseeing the delivery of these engagements. To be successful, you will need an understanding of ML and data science fundamentals, as well as software engineering best practices such as automated testing and CI/CD.
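The automated-testing practice mentioned above can be as simple as pairing each function with a unit test that CI runs on every commit. A minimal sketch (the `normalize` function and its test are hypothetical examples, not from the posting):

```python
def normalize(scores):
    """Scale a list of numeric scores to the range [0, 1]."""
    lo, hi = min(scores), max(scores)
    if lo == hi:  # avoid division by zero on constant input
        return [0.0 for _ in scores]
    return [(s - lo) / (hi - lo) for s in scores]


def test_normalize():
    # pytest-style test: any function named test_* is collected and run in CI
    assert normalize([2, 4, 6]) == [0.0, 0.5, 1.0]
    assert normalize([5, 5]) == [0.0, 0.0]
```

In a CI/CD pipeline, a test runner such as pytest would execute `test_normalize` automatically, failing the build if the model-adjacent code regresses.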


What is an MLOps Engineer? - KDnuggets

#artificialintelligence

MLOps is a relatively new term to the data industry. In the past, companies solely focused on hiring data scientists and machine learning practitioners. These individuals could build predictive models that helped companies automate workflows and make key decisions. Over time, however, machine learning projects started to cause organizations more harm than good. They failed when put into production, leading to missed business opportunities and unhappy clients.


The current state of MLOps for machine learning engineers

#artificialintelligence

This article was contributed by Aymane Hachcham, data scientist and contributor to neptune.ai. MLOps refers to the operation of machine learning in production. It combines DevOps with lifecycle tracking, reusable infrastructure, and reproducible environments to operationalize machine learning at scale across an entire organization. The term MLOps was coined by Google in their paper on machine learning operations, although it has roots in software operations. Google's goal with the paper was to introduce a new approach to developing AI products that is more agile, collaborative, and customer-centric.


Announcing Amazon SageMaker Inference Recommender

#artificialintelligence

Today, we're pleased to announce Amazon SageMaker Inference Recommender, a new Amazon SageMaker Studio capability that automates load testing and optimizes model performance across machine learning (ML) instances. Ultimately, it reduces the time it takes to get ML models from development to production and optimizes the costs associated with their operation. Until now, no service has given MLOps engineers a way to pick the optimal ML instances for their model. To optimize costs and maximize instance utilization, MLOps engineers had to rely on experience and intuition to select an ML instance type that would serve their model well, given its requirements. Moreover, given the vast array of ML instances available and the practically infinite nuances of each model, choosing the right instance type could take more than a few attempts.
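A recommendation job of this kind can also be launched programmatically via the AWS SDK. A hedged sketch, assuming the boto3 SageMaker client's `create_inference_recommendations_job` call; the job name and ARNs below are placeholders, and the request is built by a separate pure function so it can be inspected without AWS credentials:

```python
def build_recommender_request(job_name, role_arn, model_package_arn):
    """Assemble the request for a default Inference Recommender job.

    Field names follow the boto3 SageMaker API; the ARN values passed in
    are expected to be placeholders or real resources owned by the caller.
    """
    return {
        "JobName": job_name,
        "JobType": "Default",  # "Default" runs the standard instance sweep
        "RoleArn": role_arn,
        "InputConfig": {
            "ModelPackageVersionArn": model_package_arn,
        },
    }


def launch(request):
    """Submit the job to SageMaker; requires AWS credentials to succeed."""
    import boto3  # imported here so the pure builder above stays dependency-free

    return boto3.client("sagemaker").create_inference_recommendations_job(**request)
```

Splitting the request construction from the API call keeps the configurable part unit-testable, which matches the CI/CD practices these roles emphasize.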


In the 8 Key MLOps Roles, Where Do You Fit In?

#artificialintelligence

Most people don't realize that business analysts (BAs) are part of the data science team. Yet their contribution is among the most critical parts of machine learning operations. They play a translator role between business stakeholders and the technical team, specializing in speaking the language of both worlds. BAs help the technical team break the business problem down into actionable machine learning problems.